Markov chains on a measurable state space
A Markov chain on a measurable state space is a discrete-time, time-homogeneous Markov chain whose state space is a general measurable space.
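Concretely, such a chain is usually specified through a Markov kernel. The following is a minimal sketch in standard notation; the measurable space (E, Σ), the kernel p, and the initial distribution μ are not named in this excerpt and are assumed here. A map

\[
p \colon E \times \Sigma \to [0,1]
\]

is a Markov kernel if \(A \mapsto p(x,A)\) is a probability measure on \((E,\Sigma)\) for every \(x \in E\), and \(x \mapsto p(x,A)\) is \(\Sigma\)-measurable for every \(A \in \Sigma\). A process \((X_n)_{n \in \mathbb{N}}\) with values in \(E\) is then a time-homogeneous Markov chain with kernel \(p\) and initial distribution \(\mu\) if \(X_0 \sim \mu\) and, for all \(n \in \mathbb{N}\) and \(A \in \Sigma\),

\[
\Pr\!\left[X_{n+1} \in A \mid X_0, \dots, X_n\right] = p(X_n, A) \quad \text{almost surely.}
\]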
== History ==

The definition of Markov chains has evolved during the 20th century. In 1953 the term ''Markov chain'' was used for stochastic processes with a discrete or continuous index set, living on a countable or finite state space; see Doob〔Joseph L. Doob: ''Stochastic Processes''. New York: John Wiley & Sons, 1953.〕 or Chung.〔Kai L. Chung: ''Markov Chains with Stationary Transition Probabilities''. Second edition. Berlin: Springer-Verlag, 1974.〕 Since the late 20th century it has become more common to consider a Markov chain as a stochastic process with a discrete index set, living on a measurable state space.〔Sean Meyn and Richard L. Tweedie: ''Markov Chains and Stochastic Stability''. Second edition, 2009.〕〔Daniel Revuz: ''Markov Chains''. Second edition, 1984.〕〔Rick Durrett: ''Probability: Theory and Examples''. Fourth edition, 2005.〕

Excerpt source: Wikipedia, the free encyclopedia.
Read the full article "Markov chains on a measurable state space" on Wikipedia.



